Patrick Elliott wrote in message
<MPG.20243ea770f89bec989fd8@news.povray.org>:
> Sigh.. How about "blending" or "blurring" then, since the problem is
> that, basically, if you have a fine gradient that you "need" to have
> stay that way for the detail to be obvious, it's going to screw things
> up. You lose data, period. It doesn't matter if it "technically" isn't
> messing up the white or black points, if it nevertheless has the
> unfortunate side effect of making it "look" like it's doing so, by losing
> colors that *should* remain the same, just brighter or darker. It's the
> perception that matters, not what may or may not actually be happening. And in
> 90% of cases, people attempt to use gamma to correct for how "bright"
> the image is, not whether green looks 0.01% more blue on Fred's display than
> on Ralph's.
Well, with discrete, bounded color values, it is not possible to losslessly
increase the brightness. That is the pigeonhole principle.
Now, not being able to see the dark areas because there is not enough
contrast is a loss too. So the question is not loss versus no loss, but a
choice between different forms of loss. It is a matter of compromise. And
under the circumstances, a light gamma correction is a very good compromise.
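As a minimal sketch of that pigeonhole argument (my own illustration, not from the original post): any monotone brightening curve on 8-bit levels that keeps 0 and 255 fixed must send some distinct inputs to the same output. The gamma value of 2.2 and the function name below are assumptions chosen only for the example.

# Illustrative sketch: count how many distinct 8-bit levels survive a
# brightening gamma curve. The specific gamma (2.2) is an assumption.

def brighten(level: int, gamma: float = 2.2, max_value: int = 255) -> int:
    """Apply a gamma curve (brightening for gamma > 1) to one 8-bit level."""
    normalized = level / max_value
    return round((normalized ** (1.0 / gamma)) * max_value)

outputs = [brighten(v) for v in range(256)]
distinct = len(set(outputs))

print(f"distinct output levels: {distinct} of 256")
print(f"levels merged by collisions: {256 - distinct}")

# The dark end gains contrast (outputs spread apart), while the bright end
# loses distinct levels (adjacent inputs collide): the trade-off between
# different forms of loss described above.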